Estimator API
How to train Boosted Trees models in TensorFlow
Tree ensemble methods such as gradient boosted decision trees and random forests are among the most popular and effective machine learning tools for structured data. They are fast to train, work well without much tuning, and do not require large datasets. In TensorFlow, gradient boosted trees are available through the tf.estimator API, which also supports deep neural networks, wide-and-deep models, and more. For boosted trees, regression with a pre-defined mean squared error loss (BoostedTreesRegressor) and classification with cross entropy loss (BoostedTreesClassifier) are supported.
Multi-GPU training with Estimators, tf.keras and tf.data
At Zalando Research, as in most AI research departments, we realize the importance of experimenting and quickly prototyping ideas. With datasets getting bigger, it becomes useful to know how to train deep learning models quickly and efficiently on the shared resources we have. TensorFlow's Estimators API is useful for training models in a distributed environment with multiple GPUs. Here, we'll present this workflow by training a custom estimator written with tf.keras on the tiny Fashion-MNIST dataset, and then show a more practical use case at the end. Note: there's also a cool new feature the TensorFlow team has been working on (which at the time of writing is still in master) that lets you train a tf.keras model without first converting it to an Estimator, with just a couple of additional lines of code!
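As a sketch of that multi-GPU workflow, a small tf.keras model can be trained under tf.distribute.MirroredStrategy, which replicates the model across all visible GPUs and falls back to a single device on CPU-only machines. The random Fashion-MNIST-shaped arrays here are a stand-in for the real dataset.

```python
# Sketch: tf.keras training under MirroredStrategy (multi-GPU data parallelism).
# Random 28x28 arrays stand in for Fashion-MNIST images.
import numpy as np
import tensorflow as tf

x = np.random.rand(128, 28, 28).astype(np.float32)
y = np.random.randint(0, 10, size=128)

strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    # Model variables created inside the scope are mirrored on each replica.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# fit() transparently splits each batch across the replicas.
history = model.fit(x, y, batch_size=32, epochs=1, verbose=0)
```

The same script runs unchanged on one GPU, many GPUs, or CPU only; only the strategy decides where the replicas live.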
TensorFlow Estimator & Dataset APIs – Peter Roelants – Medium
When TensorFlow 1.3 was released, the Estimator and related high-level APIs caught my eye. That was almost a year ago, and TensorFlow has had a few updates since, with 1.8 the latest version at the time of writing. Time to revisit these APIs and see how they have evolved. The Estimator and Dataset APIs have become more mature since TF 1.3. The Estimator API provides a top-level abstraction and integrates nicely with other APIs, such as the Dataset API to build input pipelines and the Layers API to build model architectures.
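The Dataset-API integration amounts to chaining transformations onto a Dataset of the kind an Estimator's input_fn would return; a minimal sketch with toy numeric data:

```python
# Sketch: a tf.data input pipeline — source, map, shuffle, batch.
import numpy as np
import tensorflow as tf

features = np.arange(10, dtype=np.float32)
labels = features * 2.0

ds = (tf.data.Dataset.from_tensor_slices((features, labels))
      .map(lambda f, l: (f / 10.0, l))  # per-element preprocessing
      .shuffle(buffer_size=10)          # randomize example order
      .batch(4))                        # group into minibatches

# In eager mode the pipeline is directly iterable.
batches = [(f.numpy(), l.numpy()) for f, l in ds]
```

Ten examples batched by four yield three batches (sizes 4, 4, and 2); an input_fn would simply return `ds` instead of iterating it.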
Machine Learning as a Service with TensorFlow – freeCodeCamp
Imagine this: you've gotten aboard the AI Hype Train and decided to develop an app which will analyze the effectiveness of different chopstick types. To monetize this mind-blowing AI application and impress VCs, we'll need to open it to the world. And it had better be scalable, as everyone will want to use it. As a starting point, we will use this dataset, which contains measurements of the food-pinching efficiency of various individuals using chopsticks of different lengths. As we are not only data scientists but also responsible software engineers, we'll first draft our architecture.
How to do time series prediction using RNNs, TensorFlow and Cloud ML Engine - Dataconomy
The really cool thing from my perspective about the Estimators API is that using it is a very easy way to create distributed TensorFlow models. Many of the TensorFlow samples that you see floating around on the internets are not distributed -- they assume that you will be running the code on a single machine. People start with such code and then are immeasurably saddened to learn that the low-level TensorFlow code doesn't actually work on their complete dataset. They then have to do lots of work to add distributed training code around the original sample, and who wants to edit somebody else's code? So, please, please, please, if you see a TensorFlow sample that doesn't use the Estimators API, ignore it.
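The pattern that makes Estimator code distributable is tf.estimator.train_and_evaluate: the same script runs on a single machine or, given a TF_CONFIG environment variable describing the cluster, across many workers with no code changes. A hedged sketch, assuming a TensorFlow release that still ships tf.estimator, using a LinearRegressor on synthetic data:

```python
# Sketch: the train_and_evaluate pattern behind distributed Estimator training.
# Assumes a TF version where tf.estimator and tf.feature_column exist.
import numpy as np
import tensorflow as tf

x = np.random.rand(64, 1).astype(np.float32)
y = 3.0 * x[:, 0] + 1.0  # a simple linear target

def input_fn():
    return tf.data.Dataset.from_tensor_slices(({"x": x}, y)).batch(16)

est = tf.estimator.LinearRegressor(
    feature_columns=[tf.feature_column.numeric_column("x")])

train_spec = tf.estimator.TrainSpec(input_fn=input_fn, max_steps=20)
eval_spec = tf.estimator.EvalSpec(input_fn=input_fn, steps=2)

# Locally this alternates training and evaluation; on a cluster each
# worker, chief, and evaluator picks its role from TF_CONFIG.
tf.estimator.train_and_evaluate(est, train_spec, eval_spec)
metrics = est.evaluate(input_fn, steps=2)
```

Because the distribution logic lives in the Estimator machinery rather than the model code, samples written this way scale to full datasets without being rewritten.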